
Search in the Catalogues and Directories

Hits 1 – 14 of 14

1
Higher-order Derivatives of Weighted Finite-state Machines ...
BASE
2
On Finding the K-best Non-projective Dependency Trees ...
BASE
3
On Finding the K-best Non-projective Dependency Trees ...
Zmigrod, Ran; Vieira, Tim; Cotterell, Ryan. - ETH Zurich, 2021
BASE
4
Efficient computation of expectations under spanning tree distributions ...
Zmigrod, Ran; Vieira, Tim; Cotterell, Ryan. - ETH Zurich, 2021
BASE
5
Higher-order Derivatives of Weighted Finite-state Machines ...
Zmigrod, Ran; Vieira, Tim; Cotterell, Ryan. - ETH Zurich, 2021
BASE
6
On Finding the K-best Non-projective Dependency Trees
In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (2021)
BASE
7
Higher-order Derivatives of Weighted Finite-state Machines
In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (2021)
BASE
8
Efficient computation of expectations under spanning tree distributions
In: Transactions of the Association for Computational Linguistics, 9 (2021)
BASE
9
Efficient Sampling of Dependency Structures
In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
BASE
10
SIGMORPHON 2020 Shared Task 0: Typologically Diverse Morphological Inflection ...
BASE
11
Information-Theoretic Probing for Linguistic Structure ...
BASE
12
Information-Theoretic Probing for Linguistic Structure ...
Abstract: The success of neural networks on a diverse set of NLP tasks has led researchers to question how much these networks actually "know" about natural language. Probes are a natural way of assessing this. When probing, a researcher chooses a linguistic task and trains a supervised model to predict annotations for that task from the network's learned representations. If the probe does well, the researcher may conclude that the representations encode knowledge related to the task. A commonly held belief is that using simpler models as probes is better; the logic is that simpler models will identify linguistic structure, but not learn the task itself. We propose an information-theoretic operationalization of probing as estimating mutual information that contradicts this received wisdom: one should always select the highest-performing probe one can, even if it is more complex, since it will result in a tighter estimate, and thus reveal more of the linguistic information inherent in the representation. The ...
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics ...
(A sketch of the mutual-information bound behind this claim follows the hit list below.)
URL: http://hdl.handle.net/20.500.11850/446005
https://dx.doi.org/10.3929/ethz-b-000446005
BASE
13
Please Mind the Root: Decoding Arborescences for Dependency Parsing
In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP) (2020)
BASE
14
Information-Theoretic Probing for Linguistic Structure
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (2020)
BASE
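Note on hits 11, 12, and 14: the abstract's claim that a higher-performing probe yields a tighter estimate follows from a standard information-theoretic bound. A minimal sketch in generic notation, where R is the representation, T the linguistic annotation, and q the probe's predictive distribution (these symbols are illustrative, not taken from the paper's own notation):

\begin{align*}
I(R;T) &= H(T) - H(T \mid R) \\
H_q(T \mid R) &= H(T \mid R) + \mathbb{E}_{r}\!\left[\mathrm{KL}\!\left(p(\cdot \mid r)\,\|\,q(\cdot \mid r)\right)\right] \;\ge\; H(T \mid R) \\
\therefore\quad I(R;T) &\ge H(T) - H_q(T \mid R)
\end{align*}

Because the probe's cross-entropy H_q(T | R) can only overestimate the true conditional entropy, any probe achieving lower cross-entropy tightens the lower bound on I(R;T), which is why one should use the best probe available.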

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 14